Temporal factors in the electrophysiological markers of audiovisual speech integration
Authors
Abstract
Recent research has shown that concurrent visual speech modulates the cortical event-related potentials N1 and P2 to auditory speech. Audiovisually presented speech elicits an N1-P2 complex that is reduced in peak amplitude and shorter in peak latency than that elicited by unimodal auditory speech [11]. This effect on the N1/P2 is consistent with a model in which visual speech integrates with auditory speech at an early processing stage in auditory cortex by suppressing auditory cortical activity. We examined the effects of audiovisual temporal synchrony on these modulations of the N1/P2. With the visual stream presented in synchrony with the auditory stream, our results replicated the basic finding of reduced N1/P2 peak amplitudes relative to a unimodal auditory condition. With the visual stream temporally mismatched with the auditory stream (such that the auditory speech signal was presented 200 ms before its recorded position), the recorded N1/P2 was similar to that for unimodal auditory speech. The results are discussed in terms of van Wassenhove's 'analysis-by-synthesis' model of audiovisual integration.
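To make the dependent measures concrete, the following is a minimal illustrative sketch (in Python/NumPy, not the authors' analysis code) of how N1 and P2 peak amplitudes and latencies might be extracted from condition-averaged ERPs. The array names (erp_audio, erp_audiovisual), sampling rate, and peak-search windows are assumptions chosen for illustration only.

# Illustrative sketch: N1/P2 peak amplitude and latency from averaged ERPs.
# Assumes hypothetical 1-D arrays `erp_audio` and `erp_audiovisual` (microvolts),
# sampled at `fs` Hz, with time zero at auditory onset.
import numpy as np

fs = 500                                 # assumed sampling rate (Hz)
times = np.arange(-0.1, 0.5, 1 / fs)     # epoch from -100 ms to +500 ms

def peak_in_window(erp, t_start, t_end, polarity):
    # Return (latency in ms, amplitude in uV) of the most extreme deflection
    # of the given polarity ('neg' for N1, 'pos' for P2) within the window.
    mask = (times >= t_start) & (times <= t_end)
    segment = erp[mask]
    idx = segment.argmin() if polarity == "neg" else segment.argmax()
    return times[mask][idx] * 1000, segment[idx]

def n1_p2_measures(erp):
    n1_lat, n1_amp = peak_in_window(erp, 0.07, 0.15, "neg")   # N1: ~70-150 ms (assumed window)
    p2_lat, p2_amp = peak_in_window(erp, 0.15, 0.25, "pos")   # P2: ~150-250 ms (assumed window)
    return {"N1": (n1_lat, n1_amp), "P2": (p2_lat, p2_amp)}

# Hypothetical usage: compare peak measures across conditions.
# erp_audio, erp_audiovisual = ...   # condition-averaged ERPs, same length as `times`
# print(n1_p2_measures(erp_audio))
# print(n1_p2_measures(erp_audiovisual))

Under the reported pattern, the audiovisual condition would yield smaller N1/P2 amplitudes and earlier latencies than the auditory-only condition when the streams are synchronous, but comparable values when the auditory signal leads by 200 ms.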
Similar references
Physical and perceptual factors shape the neural mechanisms that integrate audiovisual signals in speech comprehension.
Face-to-face communication challenges the human brain to integrate information from auditory and visual senses with linguistic representations. Yet the role of bottom-up physical (spectrotemporal structure) input and top-down linguistic constraints in shaping the neural mechanisms specialized for integrating audiovisual speech signals is currently unknown. Participants were presented with spee...
Spatial and temporal factors during processing of audiovisual speech: a PET study.
Speech perception can use not only auditory signals, but also visual information from seeing the speaker's mouth. The relative timing and relative location of auditory and visual inputs are both known to influence crossmodal integration psychologically, but previous imaging studies of audiovisual speech focused primarily on just temporal aspects. Here we used Positron Emission Tomography (PET) ...
Music expertise shapes audiovisual temporal integration windows for speech, sinewave speech, and music
This psychophysics study used musicians as a model to investigate whether musical expertise shapes the temporal integration window for audiovisual speech, sinewave speech, or music. Musicians and non-musicians judged the audiovisual synchrony of speech, sinewave analogs of speech, and music stimuli at 13 audiovisual stimulus onset asynchronies (±360, ±300, ±240, ±180, ±120, ±60, and 0 ms). Furth...
Audiovisual speech integration: modulatory factors and the link to sound symbolism
In this talk, I will review some of the latest findings from the burgeoning literature on the audiovisual integration of speech stimuli. I will focus on those factors that have been demonstrated to influence this form of multisensory integration (such as temporal coincidence, speaker/gender matching, and attention; Vatakis & Spence, 2007, 2010). I will also look at a few of the oth...
Electrophysiological evidence for speech-specific audiovisual integration.
Lip-read speech is integrated with heard speech at various neural levels. Here, we investigated the extent to which lip-read induced modulations of the auditory N1 and P2 (measured with EEG) are indicative of speech-specific audiovisual integration, and we explored to what extent the ERPs were modulated by phonetic audiovisual congruency. In order to disentangle speech-specific (phonetic) integ...
Journal:
Volume, Issue:
Pages: -
Publication date: 2007